The deluge of networked data motivates the development of algorithms for computation- and communication-efficient information processing. In this context, three data-adaptive censoring strategies are introduced to considerably reduce the computation and communication overhead of decentralized recursive least-squares (D-RLS) solvers. The first relies on alternating minimization and the stochastic Newton iteration to minimize a network-wide cost, which discards observations with small innovations. In the resultant algorithm, each node performs local data-adaptive censoring to reduce computations, while exchanging its local estimate with neighbors so as to consent on a network-wide solution. The communication cost is further reduced by the second strategy, which prevents a node from transmitting its local estimate to neighbors when the innovation it induces to incoming data is minimal. In the third strategy, not only transmitting, but also receiving estimates from neighbors is prohibited when data-adaptive censoring is in effect. For all strategies, a simple criterion is provided for selecting the threshold of innovation to reach a prescribed average data reduction. The novel censoring-based (C)D-RLS algorithms are proved convergent to the optimal argument in the mean-square deviation sense. Numerical experiments validate the effectiveness of the proposed algorithms in reducing computation and communication overhead.
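The core idea of innovation-based censoring can be illustrated with a minimal single-node sketch: an RLS recursion that skips the covariance and estimate updates whenever the prediction error (innovation) falls below a threshold `tau`. This is an illustrative assumption-laden sketch, not the paper's exact (C)D-RLS recursion: the names `censored_rls`, `tau`, and `lam`, the absolute-innovation test, and the unit forgetting factor are all choices made here for exposition; the paper's decentralized algorithms additionally exchange estimates with neighbors.

```python
import numpy as np

def censored_rls(X, y, tau, lam=1.0):
    """Single-node RLS with data-adaptive censoring (illustrative sketch).

    X: (T, p) regressors; y: (T,) scalar observations.
    tau: censoring threshold on the innovation magnitude (assumed rule).
    Samples whose innovation is small are discarded, skipping the
    O(p^2) rank-one covariance update for those time steps.
    """
    p = X.shape[1]
    theta = np.zeros(p)           # current parameter estimate
    P = np.eye(p) / lam           # inverse sample-covariance surrogate
    updates = 0                   # number of uncensored (processed) samples
    for x, yt in zip(X, y):
        innov = yt - x @ theta    # innovation: prediction error on new datum
        if abs(innov) <= tau:     # censor: datum carries little new information
            continue
        Px = P @ x
        g = Px / (1.0 + x @ Px)   # RLS gain vector
        theta = theta + g * innov # estimate update
        P = P - np.outer(g, Px)   # rank-one update of P
        updates += 1
    return theta, updates
```

On synthetic data with small observation noise, most samples are censored once the estimate has converged, so `updates` ends up well below the number of observations while the estimate remains accurate; raising `tau` trades accuracy for a larger fraction of censored data.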